The Gradient Projection Method with Exact Line Search

Authors

  • William W. Hager
  • Soonchul Park
Abstract

The gradient projection algorithm for function minimization is often implemented using an approximate local minimization along the projected negative gradient. On the other hand, for some difficult combinatorial optimization problems, where a starting guess may be far from a solution, it may be advantageous to perform a nonlocal (exact) line search. In this paper we show how to evaluate the piecewise-smooth projection associated with a constraint set described by bounds on the variables and a single linear equation. When the NP-hard graph partitioning problem is formulated as a continuous quadratic programming problem, the constraints have this structure.

Mathematics Subject Classifications (2000): 90C52, 65K05, 65Y20.
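The computational kernel here is the Euclidean projection onto a set of the form {x : lo <= x <= hi, a'x = b}. As a minimal illustrative sketch (not the authors' implementation; the function name and the bisection tolerance are assumptions), the KKT conditions reduce this projection to a one-dimensional monotone root-find for the multiplier of the linear equation:

```python
import numpy as np

def project_box_hyperplane(y, a, b, lo, hi, tol=1e-12, max_iter=200):
    """Euclidean projection of y onto {x : lo <= x <= hi, a @ x = b}.

    Illustrative sketch: by the KKT conditions the projection has the
    form x(lam) = clip(y - lam*a, lo, hi), and phi(lam) = a @ x(lam) - b
    is monotone nonincreasing in lam, so bisection recovers lam.
    Assumes the feasible set is nonempty.
    """
    x = lambda lam: np.clip(y - lam * a, lo, hi)
    phi = lambda lam: a @ x(lam) - b

    # Feasibility check: b must lie between the min and max of a @ x over the box.
    assert a @ np.where(a > 0, lo, hi) <= b <= a @ np.where(a > 0, hi, lo)

    # Bracket the root by doubling outward (phi is nonincreasing in lam).
    lam_lo, lam_hi = -1.0, 1.0
    while phi(lam_lo) < 0:
        lam_lo *= 2.0
    while phi(lam_hi) > 0:
        lam_hi *= 2.0

    for _ in range(max_iter):
        lam = 0.5 * (lam_lo + lam_hi)
        if phi(lam) > 0:
            lam_lo = lam
        else:
            lam_hi = lam
        if lam_hi - lam_lo < tol:
            break
    return x(lam)
```

Given such an operator P, the nonlocal line search described in the abstract amounts to minimizing the objective along the piecewise-smooth projection arc t -> P(x - t*grad f(x)) over t >= 0.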

Similar articles

A Three-terms Conjugate Gradient Algorithm for Solving Large-Scale Systems of Nonlinear Equations

The nonlinear conjugate gradient method is well known for solving large-scale unconstrained optimization problems due to its low storage requirements and simplicity of implementation. Research on its application to higher-dimensional systems of nonlinear equations is just beginning. This paper presents a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear e...
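The truncated abstract does not show this paper's specific update, so the sketch below uses a generic three-term formula in the style of the Zhang-Zhou-Li family, applied to the merit function 0.5*||F(x)||^2; the function names, the Armijo constants, and the use of an explicit Jacobian are assumptions for illustration:

```python
import numpy as np

def three_term_cg(F, J, x, iters=200, tol=1e-8):
    """Generic three-term CG sketch (Zhang-Zhou-Li style, not the cited
    paper's method) minimizing the merit f(x) = 0.5*||F(x)||^2, whose
    gradient is g = J(x).T @ F(x)."""
    f = lambda z: 0.5 * F(z) @ F(z)
    g = J(x).T @ F(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        t, slope = 1.0, g @ d                # slope = -||g||^2 by construction
        while f(x + t * d) > f(x) + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5                         # Armijo backtracking
        x_new = x + t * d
        g_new = J(x_new).T @ F(x_new)
        y = g_new - g
        beta = (g_new @ y) / (g @ g)
        theta = (g_new @ d) / (g @ g)
        # Three-term update; it guarantees g_new @ d_new = -||g_new||^2,
        # i.e., sufficient descent independent of the line search.
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x
```

The third term exists precisely to restore the sufficient-descent property that plain two-term PRP updates can lose.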

A conjugate Rosen's gradient projection method with global line search for piecewise linear optimization

The Kelley cutting plane method is one of the methods commonly used to optimize the dual function in the Lagrangian relaxation scheme. Usually the Kelley cutting plane method uses the simplex method as the optimization engine. It is well known that the simplex method leaves the current vertex, follows an ascending edge, and stops at the nearest vertex. What would happen if one were to continue the...

On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions

We consider the gradient (or steepest) descent method with exact line search applied to a strongly convex function with Lipschitz continuous gradient. We establish the exact worst-case rate of convergence of this scheme, and show that this worst-case behavior is exhibited by a certain convex quadratic function. We also extend the result to a noisy variant of the gradient descent method, where exact...
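For a convex quadratic f(x) = 0.5*x'Ax, the exact line-search step along the negative gradient has the closed form t = (g'g)/(g'Ag), which makes the worst-case behavior easy to reproduce. The snippet below (an illustration, not code from the paper; the choice mu = 1, L = 10 and the starting point are assumptions) runs the method on a 2-D quadratic and prints a per-iteration contraction of f that matches the known factor ((L - mu)/(L + mu))^2:

```python
import numpy as np

# 2-D quadratic f(x) = 0.5*(mu*x1^2 + L*x2^2): strongly convex with
# parameter mu, gradient Lipschitz constant L.
mu, L = 1.0, 10.0
A = np.diag([mu, L])
f = lambda x: 0.5 * x @ A @ x

x = np.array([1.0 / mu, 1.0 / L])    # starting point realizing the worst case
rate = ((L - mu) / (L + mu)) ** 2    # theoretical per-step contraction of f
for k in range(5):
    g = A @ x                        # gradient of the quadratic
    t = (g @ g) / (g @ A @ g)        # exact line search along -g (closed form)
    x_next = x - t * g
    print(f"iter {k}: f-ratio = {f(x_next) / f(x):.6f}, bound = {rate:.6f}")
    x = x_next
```

From this starting point the iterates zigzag between two directions and every step attains the bound exactly, which is the sense in which a convex quadratic exhibits the worst case.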

Box-Constrained Approach for Hard Permutation Problems

Algorithm 3 details the process for incrementally computing term (13) for all xk. (The process for computing (14) is similar.) Computation of the full gradient is thus also an O(nm) operation. Using this technique, we can apply full-gradient first-order methods efficiently, including gradient projection and Frank-Wolfe. With an appropriate line-search method, gradient projection is guaranteed t...
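Algorithm 3 and terms (13)-(14) are not shown in this excerpt, so the sketch below only illustrates the generic gradient projection iteration the passage refers to, specialized to box constraints, where the projection is componentwise clipping; the function names and the Armijo constant are assumptions:

```python
import numpy as np

def projected_gradient_box(f, grad, x, lo, hi, iters=100, t0=1.0):
    """Generic projected-gradient sketch over the box [lo, hi]
    (not Algorithm 3 of the cited paper). The step length is chosen
    by Armijo-style backtracking along the projection arc."""
    for _ in range(iters):
        g = grad(x)
        t = t0
        x_new = np.clip(x - t * g, lo, hi)   # projection onto a box = clipping
        # Sufficient decrease measured against the projected step.
        while f(x_new) > f(x) + 1e-4 * g @ (x_new - x) and t > 1e-12:
            t *= 0.5
            x_new = np.clip(x - t * g, lo, hi)
        if np.allclose(x_new, x):            # fixed point: approximately stationary
            break
        x = x_new
    return x
```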

Simple examples for the failure of Newton's method with line search for strictly convex minimization

In this paper two simple examples of a twice continuously differentiable strictly convex function f are presented for which Newton's method with line search converges to a point where the gradient of f is not zero. The first example uses a line search based on the Wolfe conditions. For the second example, a strictly convex function f is defined, together with a sequence of descent directions for...

Journal:
  • J. Global Optimization

Volume 30, Issue

Pages -

Publication date: 2004